23 research outputs found

    Enhancing the Expressivity of the Sensel Morph via Audio-rate Sensing

    Get PDF
    This project describes a novel approach to hybrid electro-acoustical instruments by augmenting the Sensel Morph with real-time audio sensing capabilities. The acoustic action-sounds are captured with a piezoelectric transducer and processed in Max 8 to extend the sonic range beyond what exists in the acoustic domain alone. The control parameters are captured by the Morph and mapped to audio algorithm properties such as filter cutoff frequency, frequency shift, or overdrive. The instrument opens up the possibility for a large selection of different interaction techniques that have a direct impact on the output sound. The instrument is evaluated from a sound designer’s perspective, encouraging exploration of the materials used as well as the techniques. The contributions are two-fold. First, the use of a piezo transducer to augment the Sensel Morph affords an extra dimension of control on top of its existing offerings. Second, the use of acoustic sounds from physical interactions as a source for excitation and manipulation of an audio processing system offers a large variety of new sounds to be discovered. The methodology involved an exploratory process of iterative instrument making, interspersed with observations gathered via improvisatory trials, focusing on the new interactions made possible through the fusion of audio-rate inputs with the Morph’s default interaction methods.
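
    The paper implements this mapping in Max 8; purely as a rough illustration of the idea, the sketch below maps hypothetical Morph contact data (position and force) onto a filter cutoff and an overdrive amount applied to a block of piezo input. All names, ranges, and the simple one-pole filter are assumptions, and the frequency-shift stage is omitted.

```python
import numpy as np

def map_contact_to_params(x, force):
    """Map normalised Morph contact data (0..1) to DSP parameters (assumed ranges)."""
    cutoff_hz = 200.0 * (2.0 ** (x * 6.0))   # x position -> roughly 200 Hz .. 12.8 kHz
    drive = 1.0 + force * 9.0                # contact force -> overdrive amount
    return cutoff_hz, drive

def one_pole_lowpass(block, cutoff_hz, sr=48000, state=0.0):
    """Very simple one-pole low-pass applied to a block of piezo samples."""
    a = np.exp(-2.0 * np.pi * cutoff_hz / sr)
    out = np.empty_like(block)
    for i, sample in enumerate(block):
        state = (1.0 - a) * sample + a * state
        out[i] = state
    return out, state

def overdrive(block, drive):
    """Soft clipping as a stand-in for an overdrive stage."""
    return np.tanh(drive * block)

# One block of fake piezo input shaped by one fake Morph contact
piezo_block = np.random.randn(256) * 0.1
cutoff, drive = map_contact_to_params(x=0.7, force=0.5)
filtered, _ = one_pole_lowpass(piezo_block, cutoff)
processed = overdrive(filtered, drive)
```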

    Resurrecting the Tromba Marina: A Bowed Virtual Reality Instrument using Haptic Feedback and Accurate Physical Modelling

    Get PDF
    This paper proposes a multisensory simulation of a tromba marina, a bowed string instrument, in virtual reality. The auditory feedback is generated by an accurate physical model, the haptic feedback is provided by the PHANTOM Omni, and the visual feedback is rendered through an Oculus Rift CV1 head-mounted display (HMD). Moreover, a user study exploring the experience of interacting with a virtual bowed string instrument is presented, together with an evaluation of the system's playability. The study comprises both qualitative (observations, think-aloud and interviews) and quantitative (survey) data collection methods. The results indicate that the implementation was successful, offering participants realistic feedback as well as a satisfactory multisensory experience, allowing them to use the system as a musical instrument.
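
    The abstract does not detail the physical model; as a rough indication of the family of models it refers to, the sketch below runs an explicit finite-difference scheme for a lossy 1D wave-equation string and reads the output at one point. The actual tromba marina model (bow friction, bridge rattle, stiffness) is considerably more involved, and all parameter values here are illustrative assumptions.

```python
import numpy as np

SR   = 48000                 # audio sample rate
k    = 1.0 / SR              # time step
f0   = 110.0                 # fundamental (Hz) -> wave speed via c = 2 L f0
L    = 1.0                   # string length (m)
c    = 2.0 * L * f0          # wave speed
h    = c * k                 # grid spacing at the stability limit
N    = int(np.floor(L / h))  # number of grid intervals
lam2 = (c * k / (L / N))**2  # Courant number squared (<= 1, so the scheme is stable)
sig  = 1.0                   # frequency-independent loss parameter

u_prev = np.zeros(N + 1)
u      = np.zeros(N + 1)
u[1:N] = np.hanning(N - 1) * 0.01   # crude initial excitation instead of a bow

out = np.zeros(SR)                  # one second of output
for n in range(SR):
    u_next = np.zeros(N + 1)
    # update interior points; endpoints stay clamped at zero
    u_next[1:N] = (2*u[1:N] - (1 - sig*k)*u_prev[1:N]
                   + lam2*(u[2:] - 2*u[1:N] + u[:-2])) / (1 + sig*k)
    out[n] = u_next[N // 2]         # read displacement near the middle of the string
    u_prev, u = u, u_next
```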

    Keytar: Melodic control of multisensory feedback from virtual strings

    Get PDF
    A multisensory virtual environment has been designed, aiming at recreating a realistic interaction with a set of vibrating strings. Haptic, auditory and visual cues progressively instantiate the environment: force and tactile feedback are provided by a robotic arm rendering the string reaction and string surface properties, and furthermore defining the physical touchpoint in the form of a virtual plectrum embodied by the arm stylus. Auditory feedback is instantaneously synthesized as a result of the contacts of this plectrum against the strings, reproducing guitar sounds. A simple visual scenario contextualizes the plectrum in action along with the vibrating strings. Notes and chords are selected using a keyboard controller, so that one hand is engaged in the creation of a melody while the other hand plucks virtual strings. These components have been integrated within the Unity3D simulation environment for game development and run together on a PC. As reported by a group of users testing a monophonic Keytar prototype with no keyboard control, the most significant contribution to the realism of the strings is given by the haptic feedback, in particular by the textural nuances that the robotic arm synthesizes while reproducing the physical attributes of a metal surface. Their opinion hence argues in favor of the importance of factors other than auditory feedback for the design of new musical interfaces.
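
    The abstract does not say how the guitar sounds are synthesized; purely as a stand-in, the sketch below uses Karplus-Strong plucked-string synthesis to show what triggering one note per plectrum-string contact could look like. The chord frequencies and decay factor are illustrative, and nothing here reproduces the Unity3D or haptic pipeline described in the paper.

```python
import numpy as np

def pluck(freq_hz, duration_s=1.0, sr=44100, decay=0.996):
    """Karplus-Strong: a noise burst circulated through a looped averaging filter."""
    period = int(sr / freq_hz)                 # delay-line length in samples
    delay = np.random.uniform(-1, 1, period)   # initial excitation (the "pluck")
    out = np.zeros(int(sr * duration_s))
    for n in range(len(out)):
        out[n] = delay[n % period]
        # average with the next sample in the loop and apply decay
        delay[n % period] = decay * 0.5 * (delay[n % period] + delay[(n + 1) % period])
    return out

# e.g. the keyboard hand selects an E-minor voicing, the plectrum hand plucks it
chord = sum(pluck(f) for f in (82.41, 123.47, 164.81, 196.00, 246.94, 329.63))
```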

    Multisensory Integration Design in Music for Cochlear Implant Users

    Get PDF
    Cochlear implant (CI) users experience several challenges when listening to music. However, their hearing abilities are greatly diverse and their musical experiences may vary significantly from each other. In this research, we investigate this diversity in CI users' musical experience, preferences, and practices. We integrate multisensory feedback into their listening experiences to support the perception of specific musical features and elements. Four installations are implemented, each exploring a different sensory modality assisting or supporting CI users' listening experience. We study these installations through semi-structured and exploratory workshops with participants. We report the results of our process-oriented assessment of CI users' experience with music. Because the CI community is a minority participant group in music, musical instrument design frameworks and practices differ from those of hearing cultures. We share guidelines for designing multisensory integration, derived from our studies with individual CI users and specifically aimed at enriching their experiences.

    Tactile displays for auditory augmentation – A scoping review and reflections on music applications for hearing-impaired users

    Get PDF
    The field of tactile augmentation has progressed greatly over the past 27 years and currently constitutes an emerging area of research, bridging topics ranging from neuroscience to robotics. One particular area of interest is the use of tactile augmentation to provide inclusive musical experiences for deaf or hard-of-hearing individuals. This article details a scoping review that investigates and organizes tactile displays used for the augmentation of music within the field of hearing assistive devices, as documented in 63 scientific publications. The focus is on the hardware, software, mapping, and evaluation of these displays, in order to identify established methods and techniques, as well as potential gaps in the literature. To achieve this purpose, a catalog of devices was created from the available literature indexed in the Scopus® database. We set up a list of 12 descriptors belonging to the physical, auditory, perceptual, purpose and evaluation domains, and each tactile display identified was categorized according to them. The frequency of use of these descriptors was analyzed, as well as possible relationships between them. Results indicate that the field is relatively new, with 80% of the indexed literature published after 2009. Moreover, most of the research is conducted in laboratories, with limited industry reach. Most of the studies have low reliability due to small sample sizes, and sometimes low validity due to limited access to the targeted population (e.g., evaluating systems designed for cochlear implant users on normal-hearing individuals). When it comes to the tactile displays, the results show that the hand area is targeted by the majority of the systems, probably due to its higher sensitivity, and that only a couple of popular mapping systems are used by the majority of researchers. Additional aspects of the displays were investigated, including the historical distribution of various characteristics (e.g., number of actuators, or actuator type) as well as the sonic material used as input. Finally, a discussion of the current state of the tactile augmentation of music is presented, together with suggestions for potential future research.
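
    The abstract notes that a small number of mapping systems dominate the catalogued displays but does not name them; as a purely illustrative example of that class of mapping, the sketch below splits an audio block into a few frequency bands and lets each band's energy drive one vibrotactile actuator at a fixed carrier frequency. The band edges, actuator count, and 250 Hz carrier (near peak skin sensitivity) are assumptions, not taken from the review.

```python
import numpy as np
from scipy.signal import butter, sosfilt

SR = 44100
BAND_EDGES = [(60, 250), (250, 1000), (1000, 4000)]   # one band per actuator

def band_energies(block):
    """Return the RMS energy of each band for one block of audio."""
    energies = []
    for lo, hi in BAND_EDGES:
        sos = butter(4, [lo, hi], btype="bandpass", fs=SR, output="sos")
        band = sosfilt(sos, block)
        energies.append(np.sqrt(np.mean(band**2)))
    return energies

def actuator_signals(block, carrier_hz=250.0):
    """Drive each actuator with a fixed carrier scaled by its band's energy."""
    t = np.arange(len(block)) / SR
    carrier = np.sin(2 * np.pi * carrier_hz * t)
    return [e * carrier for e in band_energies(block)]

# Example: three actuator drive signals computed from one block of (fake) music
drive = actuator_signals(np.random.randn(4096) * 0.1)
```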

    No Strings Attached: Force and Vibrotactile Feedback in a Guitar Simulation

    Get PDF
    In this paper we propose a multisensory simulation of plucking guitar strings in virtual reality. The auditory feedback is generated by a physics-based simulation of guitar strings, and haptic feedback is provided by a combination of high-fidelity vibrotactile actuators and a Phantom Omni haptic device. Moreover, we present a user study (n=29) exploring the perceived realism of the simulation and the relative importance of force and vibrotactile feedback for creating a realistic experience of plucking virtual strings. The study compares four conditions: no haptic feedback, vibrotactile feedback, force feedback, and a combination of force and vibrotactile feedback. The results indicate that the combination of vibrotactile and force feedback elicits the most realistic experience, and during this condition the participants were less likely to inadvertently hit strings after the intended string had been plucked. Notably, no statistically significant differences were found between the conditions involving either vibrotactile or force feedback, which suggests that haptic feedback is important but does not need to be high fidelity in order to enhance the quality of the experience.
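
    As a rough illustration of the four-condition design, and not the authors' implementation, the sketch below routes the state of one string-simulation step into the audio, vibrotactile, and force channels, with per-condition flags switching the two haptic channels on or off. The channel quantities and device handling are placeholders.

```python
from dataclasses import dataclass

@dataclass
class Condition:
    vibrotactile: bool
    force: bool

CONDITIONS = {
    "none":  Condition(False, False),
    "vibro": Condition(True,  False),
    "force": Condition(False, True),
    "both":  Condition(True,  True),
}

def render_feedback(cond: Condition, string_velocity: float, string_tension: float):
    """Split one simulation step into the three feedback channels."""
    audio_sample = string_velocity                        # audio is always rendered
    vibro_amp = abs(string_velocity) if cond.vibrotactile else 0.0
    force_n = string_tension if cond.force else 0.0       # would be sent to the force-feedback device
    return audio_sample, vibro_amp, force_n

# e.g. the "both" condition during one simulation step
print(render_feedback(CONDITIONS["both"], string_velocity=0.02, string_tension=0.4))
```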

    Gestural Control of Wavefield Synthesis

    Get PDF
    (Abstract to follow)

    Wavefield Synthesis for Max/MSP

    No full text